Nonlinear System Identification: A Palette from Off-white to Pit-black


Nonlinear System Identification: A Palette from Off-white to Pit-black
Lennart Ljung, Automatic Control, ISY, Linköpings Universitet
Brussels Workshop, April 2017

The Problem
(Example data sets: missile dynamics, a pulp buffer vessel, a forest crane; plots of measured inputs and outputs.)

This Presentation...
...aims at a display of the essence of the problem of nonlinear identification, and a color-coded overview of typical parametric approaches.

A Common Frame
The world of nonlinear models is very diverse. A common framework: discrete-time observations of inputs and outputs,
    Z^t = [u(1), u(2), ..., u(t), y(1), y(2), ..., y(t)]
A model is a parameterized predictor of the next output y(t), made at time t-1:
    ŷ(t|t-1, θ) = ŷ(t|θ) = h(Z^(t-1), θ)
The parameters can be estimated using the prediction-error method (which could be maximum likelihood):
    θ̂ = arg min_θ Σ_t ||y(t) - h(Z^(t-1), θ)||²
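As a sketch of this criterion (not code from the talk; all signals and the "true" parameter values are invented for illustration), one can fit a two-parameter linear-in-parameters predictor by minimizing the prediction-error sum of squares, which here reduces to solving the 2x2 normal equations:

```python
# Sketch: prediction-error estimation for a toy predictor
# yhat(t|theta) = a1*u(t-1) + a2*u(t-2). Invented data.

def predictor(theta, u, t):
    a1, a2 = theta
    return a1 * u[t-1] + a2 * u[t-2]

def pem_cost(theta, u, y):
    # V(theta) = sum_t (y(t) - yhat(t|theta))^2
    return sum((y[t] - predictor(theta, u, t))**2 for t in range(2, len(y)))

# Noise-free data from a "true" system with theta0 = (0.5, 0.2)
u = [((7 * k) % 5) - 2 for k in range(50)]      # deterministic input
y = [0.0, 0.0] + [0.5*u[t-1] + 0.2*u[t-2] for t in range(2, 50)]

# Linear-in-parameters case: minimise V by the 2x2 normal equations
s11 = sum(u[t-1]*u[t-1] for t in range(2, 50))
s12 = sum(u[t-1]*u[t-2] for t in range(2, 50))
s22 = sum(u[t-2]*u[t-2] for t in range(2, 50))
b1 = sum(y[t]*u[t-1] for t in range(2, 50))
b2 = sum(y[t]*u[t-2] for t in range(2, 50))
det = s11*s22 - s12*s12
theta_hat = ((s22*b1 - s12*b2) / det, (s11*b2 - s12*b1) / det)
```

For nonlinear predictors h the same cost is minimized numerically instead of in closed form.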

What's Special with Nonlinear Models?
ŷ(t|θ) = h(Z^(t-1), θ) is a nonlinear function of Z. What makes the nonlinear problem much more difficult and rich than the linear problem? Two major reasons:
- the richness of the model surface;
- propagation of noise signals to the output is not immediate.

The Model Surface
Let us take Z^t = [u(t-1), u(t-2)] and a scalar output y(t). A model is then a surface in the space spanned by [y(t), u(t-1), u(t-2)], and the estimation task is to estimate this surface.
    Linear:    ŷ(t) = a1 u(t-1) + a2 u(t-2)
    Nonlinear: ŷ(t) = h(u(t-1), u(t-2))
The observations Z^t are points in this space.

Propagation of Noise Signals
In cascaded linear systems we can always propagate the noise signals to the output: y = Gu + He, where H picks up the coloring obtained by propagating the noise through a linear system. For nonlinear systems this is in general not possible. Example: a linear system plus noise, z = Gu + w, is followed by a static nonlinearity f(z). At the output we have
    y(t) = f(Gu + w) = f(Gu) + w̄,   w̄ = f(Gu + w) - f(Gu)
Here w̄ is not really a "noise": it is clearly contaminated with the input u, which will create bias problems when minimizing the output error. This indicates that calculating the true predictor can be challenging.

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing
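The noise-propagation point can be made concrete with a tiny invented example (a pure gain G and f(z) = z², chosen only for illustration): the "output noise" w̄ = f(Gu + w) - f(Gu) = 2Guw + w² is neither zero-mean nor independent of the input, even though w itself is zero-mean:

```python
# Sketch: why noise cannot simply be moved past a static nonlinearity.
# Invented signals; G is a pure gain, f(z) = z^2.
G = 2.0
f = lambda z: z**2

u = [k % 3 - 1 for k in range(300)]              # deterministic zero-mean input
w = [1.0 if k % 2 else -1.0 for k in range(300)] # zero-mean "noise", w^2 = 1

wbar = [f(G*u[k] + w[k]) - f(G*u[k]) for k in range(300)]

mean_w = sum(w) / len(w)          # = 0: w itself is zero-mean
mean_wbar = sum(wbar) / len(wbar) # = 1: wbar is biased by the E[w^2] term
# wbar - 1 = 2*G*u*w here, so wbar is exactly proportional to u*w:
coupling = sum((wbar[k] - 1.0) * u[k] * w[k] for k in range(300)) / 300
```

The nonzero mean and the input coupling are exactly the bias mechanisms referred to above.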

Off-white Models: Physical Modeling
Perform physical modeling (e.g. in MODELICA) and denote the unknown physical parameters by θ. Collect the model equations as
    ẋ(t) = f(x(t), u(t), θ)
    y(t) = h(x(t), u(t), θ)
(or in DAE, differential-algebraic equation, form). For each parameter value θ this defines a simulated output ŷ(t|θ), which is the parameterized function ŷ(t|θ) = h(Z^(t-1), θ) (with Z^t = u^t) in somewhat implicit form. To be a correct predictor this really assumes white measurement noise. The estimate is then the θ that minimizes the output-error fit Σ_t ||y(t) - ŷ(t|θ)||².

Example: Missile
A nonlinear missile system with measured inputs u and outputs y, and unknown parameters p. The model equations (excerpt):

    function [dx, y] = missile(t, x, p, u)
    % MISSILE  A non-linear missile system.
    % Output equation: outputs built from the states x,
    % the parameters p and the inputs u.
    y = [ ... ];
    % State equations, e.g. angular velocity around the x-axis.
    dx = [ ... ];

Initial Fit between Model and Data
(Figure: measured outputs, thin lines, versus simulated model outputs, thick lines, over time. The original Swedish legend reads "Tunn: measurement; Tjock: simulation", i.e. thin: measurement; thick: simulation.)
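The off-white workflow can be sketched end to end on a hypothetical first-order system (not the missile model; equations, input and "true" parameter are invented): simulate ŷ(t|θ) from the model equations and pick the θ minimizing the output error:

```python
# Sketch: output-error fitting of a physical model xdot = -theta*x + u, y = x.
# Invented system and data; forward Euler stands in for a proper ODE solver.

def simulate(theta, u, dt=0.1):
    x, y = 0.0, []
    for uk in u:
        y.append(x)
        x += dt * (-theta * x + uk)   # Euler step of the model equations
    return y

u = [1.0 if (k // 20) % 2 == 0 else 0.0 for k in range(200)]  # square-wave input
y_meas = simulate(0.7, u)                                      # "measured" data, true theta = 0.7

def oe_cost(theta):
    # Output-error criterion: sum_t (y(t) - yhat(t|theta))^2
    return sum((ym - ys)**2 for ym, ys in zip(y_meas, simulate(theta, u)))

# Crude minimisation by grid search over candidate theta values
theta_hat = min([0.1 * k for k in range(1, 21)], key=oe_cost)
```

In practice a gradient-based optimizer replaces the grid search, but the structure (simulate, compare, adjust θ) is the same.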

Adjusted Fit between Model and Data
(Figure: measured outputs, thin lines, versus simulated model outputs, thick lines, after adjusting the parameters.)

Off-white Models with Noise Models
The (output-error, off-white) approach is conceptually simple, but can be very demanding in practice. A main shortcoming is the use of the output-error criterion, which really assumes white measurement noise. Noise signals in nonlinear models cannot really be propagated to the output. If the size of the noise is non-trivial, more careful noise modeling should be done:
    ẋ(t) = f(x(t), u(t), w(t), θ)
    y(t) = h(x(t), u(t), θ) + e(t)
where w and e are white noises. To find the correctly predicted outputs ŷ(t|Z^(t-1), θ) = E(y(t)|Z^(t-1), θ) is then the well-known intractable problem of nonlinear filtering. Often one has to resort to some simplistic observer.

Probabilistic Learning
Recently, however, with the increasing computing power, new computation-intensive simulation-based methods have been developed for the nonlinear filtering problem, and hence for applying the maximum likelihood method to nonlinear state-space models: particle filtering, Markov chain Monte Carlo (MCMC), sequential Monte Carlo, ... Loosely and briefly, these are based on simulating the noisy state-space model and evaluating the state probabilities, focusing on paths that give the measured output sequence. Making these calculations as efficient as possible is a central current research area, probabilistic learning. See e.g. Thomas B. Schön, Andreas Svensson, Lawrence Murray, and Fredrik Lindsten: Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo, arXiv.

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing
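The "simulate the noisy model, focus on paths that explain the measurements" idea can be sketched as a bootstrap particle filter on a toy scalar model (invented model and parameters, not from the talk):

```python
# Sketch: bootstrap particle filter for x(t+1) = 0.8 x(t) + w(t), y(t) = x(t) + e(t).
# Invented toy model; illustrates the simulation-based state estimation
# underlying probabilistic learning.
import math
import random

random.seed(0)
A, QW, RE, N = 0.8, 0.1, 0.1, 500   # dynamics, noise variances, particle count

# Simulate "measured" data from the true model
x, xs, ys = 1.0, [], []
for _ in range(50):
    x = A * x + random.gauss(0.0, math.sqrt(QW))
    xs.append(x)
    ys.append(x + random.gauss(0.0, math.sqrt(RE)))

particles = [random.gauss(0.0, 1.0) for _ in range(N)]
est = []
for y in ys:
    # Propagate each particle through the noisy dynamics (simulation step)
    particles = [A * p + random.gauss(0.0, math.sqrt(QW)) for p in particles]
    # Weight particles by how well they explain the measurement
    wts = [math.exp(-(y - p)**2 / (2 * RE)) for p in particles]
    s = sum(wts)
    wts = [w / s for w in wts]
    est.append(sum(w * p for w, p in zip(wts, particles)))
    # Resample: concentrate on likely state paths
    particles = random.choices(particles, weights=wts, k=N)

rmse = math.sqrt(sum((e - t)**2 for e, t in zip(est, xs)) / len(xs))
```

Wrapping such a filter in a likelihood evaluation, and iterating over θ, is what the sequential Monte Carlo identification methods cited above automate.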

Smoke-grey: Semi-physical Models
Apply nonlinear transformations to the measured data, so that the transformed data stand a better chance of describing the system in a linear relationship. Rules: only high-school physics, and a few minutes of work.
Toy example, an immersion heater: the input is the voltage to the heater, the output is the temperature of the fluid. Square the voltage! The moral: no excuse for not thinking over the basic physical facts!
Another example follows.

Buffer Vessel Dynamics
(Figure: output, the κ-number of the outflow; inputs, the κ-number of the inflow together with the pulp flow and the vessel volume.)

Linear Model Based on Raw Data
(Figure: measured output and simulated model output. Dashed line: κ-number after the vessel, actual measurements. Solid line: simulated κ-number, using the input only and a first-order process model with time delay estimated from the first part of the data. The fit is poor.)

Now It's Time to Think...
- No mixing ("plug flow"): the vessel is then just a pure time delay for the pulp flow. Delay time: vessel volume / pulp flow (dimension: time).
- Perfect mixing in the tank: a textbook first-order system with gain 1 and time constant = volume/flow.
So if volume and flow are changing, we have a nonlinear system! The natural time variable is really volume/flow (which we have measured). Let us re-sample the observed data according to this natural time variable.
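The immersion-heater trick is easy to verify numerically on invented data (the system below, y(t) = 0.9 y(t-1) + 0.1 u(t-1)², is a made-up heat balance, not from the talk): regressing on the squared voltage gives an essentially perfect linear fit, while the raw voltage does not:

```python
# Sketch: semi-physical modeling of an immersion heater. Temperature is
# driven by electrical power ~ u^2, so regress y on [y(t-1), u(t-1)^2]
# rather than the raw voltage. Invented system and data.

def fit2(r1, r2, y):
    # Least squares y ~ a*r1 + b*r2 via the 2x2 normal equations;
    # returns (a, b, residual sum of squares).
    s11 = sum(a*a for a in r1); s12 = sum(a*b for a, b in zip(r1, r2))
    s22 = sum(b*b for b in r2)
    c1 = sum(a*t for a, t in zip(r1, y)); c2 = sum(b*t for b, t in zip(r2, y))
    det = s11*s22 - s12*s12
    a = (s22*c1 - s12*c2) / det
    b = (s11*c2 - s12*c1) / det
    rss = sum((t - a*p - b*q)**2 for p, q, t in zip(r1, r2, y))
    return a, b, rss

u = [k % 4 for k in range(100)]                  # voltage steps
y = [0.0]
for t in range(1, 100):
    y.append(0.9 * y[t-1] + 0.1 * u[t-1]**2)     # heat balance: power ~ u^2

ylag = y[:-1]
_, _, rss_raw = fit2(ylag, u[:-1], y[1:])                  # raw voltage regressor
_, _, rss_semi = fit2(ylag, [v**2 for v in u[:-1]], y[1:]) # squared voltage
```

A one-line data transformation, motivated only by high-school physics, turns a nonlinear problem into an exact linear regression.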

Re-sample Data
In MATLAB-style code (reconstructed from the slide, using the standard interp1):

    z = [y, u];
    pf = flow ./ level;
    t = (1:length(z))';
    newt = interp1(cumsum(pf), t, (pf(1):sum(pf))');  % clock times of a grid uniform in accumulated flow
    newz = interp1(t, z, newt);                        % data re-sampled in the natural time variable

Semi-physical Model
(Figure: κ-number of the inflow and of the outflow after re-sampling; measured output versus simulated model output.) In the re-sampled time the estimated model is again first order with a time delay, G(s) = ... e^(-69.8s), and the fit improves substantially on the raw-data linear model. The semi-physical model gives a sufficiently good description of the buffer to allow proper time-marking of the pulp before and after.

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing

Steel-Grey: Composite Local Models
Nonlinear systems are often handled by linearization around a working point. The idea behind composite local (local linear) models is to deal with the nonlinearities by selecting or averaging over relevant linearized models. Example: a tank with inflow u, free outflow y and level h. Bernoulli's equation gives
    ḣ = -√h + u;   y = √h
Linearize around a level h₀ with corresponding flows u₀ = y₀ = √h₀:
    ḣ = -(1/(2√h₀))(h - h₀) + (u - u₀);   y = y₀ + (1/(2√h₀))(h - h₀)
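The re-sampling step can be sketched in Python with invented signals (the flow values and measured signal below are made up; a hand-rolled linear interpolator stands in for interp1): accumulate the flow per sample, then interpolate the data onto a grid that is uniform in the accumulated quantity:

```python
# Sketch: re-sampling data in the "natural" time variable. Invented signals.

def interp(tgrid, tdata, zdata):
    # Piecewise-linear interpolation of (tdata, zdata) at the points tgrid.
    # Assumes every t in tgrid satisfies tdata[0] <= t <= tdata[-1].
    out = []
    for t in tgrid:
        i = max(j for j in range(len(tdata)) if tdata[j] <= t)
        if i == len(tdata) - 1:
            out.append(zdata[-1])
        else:
            frac = (t - tdata[i]) / (tdata[i+1] - tdata[i])
            out.append(zdata[i] + frac * (zdata[i+1] - zdata[i]))
    return out

flow = [1.0, 1.0, 2.0, 2.0, 4.0, 4.0]   # per-sample flow (invented)
z = [0.0, 1.0, 4.0, 9.0, 16.0, 25.0]    # measured signal at uniform clock time

cumf, acc = [], 0.0
for f in flow:
    acc += f
    cumf.append(acc)                     # accumulated "natural time"

# New grid: uniform steps in accumulated flow, mapped back to clock time
ngrid = [1.0 * k for k in range(1, int(cumf[-1]) + 1)]
tgrid = interp(ngrid, cumf, list(range(len(cumf))))  # clock times of new samples
z_new = interp(tgrid, list(range(len(z))), z)        # re-sampled signal
```

Where the flow is high, samples are taken more densely in clock time, which is exactly the effect the MATLAB snippet above achieves.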

Tank Example, Continued
A sampled-data model around level h₀ (sampling time T_s):
    y(t) = γ(h₀) + α(h₀) y(t - T_s) + β(h₀) u(t - T_s) = θᵀ(h₀) φ(t)
This is an ARX model with level-dependent parameters. Now compute linearized models for d different levels h₁, h₂, ..., h_d. Total model: select or average over these local models,
    ŷ(t) = Σ_{k=1..d} w_k(h(t), h_k) θᵀ(h_k) φ(t)
with various possible choices of the weights w_k.

Data and Linear Model
(Figure: measured data, inflow u and level h, and a single global linear model (d = 1). Thick line: model; thin line: measured.)

Local Linear Models
(Figure: composite models with two levels/models (d = 2) and with five levels/models (d = 5).)

Composite Local Models: General Comments
Let the measured working-point variable (the tank level in the example) be denoted by ρ(t) (sometimes called the regime variable or scheduling variable). If the regime variable is partitioned into d values ρ_k, and the model output according to value ρ_k is ŷ^(k)(t), the predicted output will be
    ŷ(t) = Σ_{k=1..d} w_k(ρ(t), ρ_k) ŷ^(k)(t)
If the prediction ŷ^(k)(t) corresponding to ρ_k is linear in the parameters, ŷ^(k)(t) = φᵀ(t)θ^(k), and the weights w are fixed, the whole model will be a linear regression. There are important connections to active research areas:
- LPV (linear parameter-varying) models;
- hybrid models (where w(·,·) is estimated too).
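A minimal sketch of the weighted prediction (the local ARX parameters, operating levels and Gaussian weighting below are all invented for illustration):

```python
# Sketch: composite local model yhat = sum_k w_k(rho) * theta_k' * phi,
# with normalised Gaussian weights around each operating level. Invented values.
import math

levels = [1.0, 4.0]                    # operating points h_k
thetas = [(0.90, 0.10), (0.95, 0.05)]  # local ARX parameters (a_k, b_k)

def weights(rho, width=1.5):
    raw = [math.exp(-((rho - hk) / width) ** 2) for hk in levels]
    s = sum(raw)
    return [r / s for r in raw]        # normalised so the weights sum to 1

def predict(rho, y_prev, u_prev):
    # Blend the local ARX predictors according to the current level rho
    return sum(wk * (a * y_prev + b * u_prev)
               for wk, (a, b) in zip(weights(rho), thetas))

# Near level 1 the first local model dominates; near level 4 the second does
lo = predict(1.0, 2.0, 1.0)   # close to 0.90*2 + 0.10*1
hi = predict(4.0, 2.0, 1.0)   # close to 0.95*2 + 0.05*1
```

With fixed weights, stacking all d parameter vectors into one regression recovers the linear-regression property noted above.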

LPV State-Space Models
    x(t+1) = A(ρ)x(t) + B(ρ)u(t)
    y(t) = C(ρ)x(t)
is a linear model for each fixed ρ. If ρ ∈ Ω = {ρ₁, ..., ρ_d} it is a set of local linear models. If ρ = ρ(t) is time-varying, we have a linear parameter-varying model. A basic difficulty is to find a common state basis from input/output observations, and to manage the time variation in ρ(t) in relation to the observations.

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing

Slate-Grey: Block-oriented Models
Building blocks: a linear dynamic system G(s), and a nonlinear static function f(u).

Common Models
Wiener: a linear block followed by a static nonlinearity. Hammerstein: a static nonlinearity followed by a linear block. Hammerstein-Wiener: a linear block sandwiched between two static nonlinearities.
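One of the classical algorithmic tricks for Hammerstein structures can be sketched on an invented system (not from the talk): if the input nonlinearity is expanded in known basis functions, say f(u) = c1·u + c2·u², then a Hammerstein ARX model y(t) = a·y(t-1) + f(u(t-1)) is linear in (a, c1, c2) and ordinary least squares applies:

```python
# Sketch: the linear-regression trick for a Hammerstein ARX model.
# Invented true system: y(t) = 0.8*y(t-1) + 0.5*u(t-1) + 0.3*u(t-1)^2.

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

u = [((3 * k) % 7) / 3.0 - 1.0 for k in range(80)]   # deterministic input
y = [0.0]
for t in range(1, 80):
    y.append(0.8 * y[t-1] + 0.5 * u[t-1] + 0.3 * u[t-1]**2)

phi = [(y[t-1], u[t-1], u[t-1]**2) for t in range(1, 80)]
tgt = y[1:]
A = [[sum(p[i] * p[j] for p in phi) for j in range(3)] for i in range(3)]
b = [sum(p[i] * t for p, t in zip(phi, tgt)) for i in range(3)]
a_hat, c1_hat, c2_hat = solve3(A, b)
```

This overparameterization idea is one of the "many algorithmic variants" developed for block-oriented structures; Wiener-side nonlinearities require more work.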

Other Combinations: an Active Research Field
With the linear blocks parameterized as linear dynamic systems, and the static blocks parameterized as functions ("curves"), this gives a parameterization of the output as ŷ(t|θ) = h(Z^(t-1), θ), and the general approach of model fitting can be applied. In this context, however, many algorithmic variants have been suggested.

Example: Hydraulic Crane Data
These are data from a forest harvesting machine. Input: hydraulic pressure. Output: tip position. (Figure: measured input and output records.)

Linear Model versus Hammerstein Model of the Hydraulic Crane
(Figures: measured output in black, model-simulated output in blue, for a linear model and for a Hammerstein model; the Hammerstein model fits considerably better.) The Hammerstein model gives a good fit. The extra flexibility offered by the input nonlinearity is quite useful (even though no direct physical explanation is obvious).

Noise Effects in Hammerstein-Wiener?
There is frequently reason to assume that some noise enters before the output nonlinearity g. (Block diagram: the input u(t) passes through an input nonlinearity f(·, α) and a linear system G(q, ϑ); colored noise z_t^w = H_w(q, µ)w_t enters before the output nonlinearity g(·, β), and colored noise z_t^v = H_v(q, η)v_t is added at the output y(t).) What happens if we propagate that noise to the output and apply an output-error criterion to this input-output system?

Output Error Method for the HW Model
(Figure: Bode plots for the linear system, and plots of the input nonlinearities (saturation, dead zone) and output nonlinearities (dead zone, saturation), as estimated by the OE method. Blue curve: the true system. Red curves: median and standard deviations for the estimated systems over the Monte Carlo runs. The OE estimates deviate clearly from the true system.)

Maximum Likelihood (EM) for HW Models
It is clear that more effort must be paid to the noise structure. We turn to the maximum likelihood method for the HW model structure. It is a complication that the ML criterion cannot easily be formed. But if the unmeasured noise z_t^w were known, it would be easy to compute the ML criterion. So, treat it as incomplete data X and apply the EM algorithm, which iterates between estimating X and estimating the model for this X.
Ref: Adrian Wills, Thomas B. Schön, Lennart Ljung, Brett Ninness: Identification of Hammerstein-Wiener models, Automatica.

Results for the ML-EM Method for HW Models
(Figure: the same Bode and nonlinearity plots for the ML-EM method; the estimates now follow the true system closely.)
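The E-step/M-step pattern can be sketched on a deliberately simple toy problem (a two-component Gaussian mixture with invented data, unit variances and equal weights; this is not the HW estimator, only the same alternation between estimating the unobserved quantities and re-estimating the parameters):

```python
# Sketch: the EM iteration on a toy incomplete-data problem.
# E-step: estimate the unobserved component memberships (responsibilities).
# M-step: re-estimate the parameters (the two means) given them.
import math

data = [-2.1, -1.9, -2.0, -2.2, 1.8, 2.0, 2.1, 1.9, 2.2, -1.8]  # invented
mu = [-0.5, 0.5]                                                  # initial guess
for _ in range(50):
    # E-step: responsibility of component 1 for each point
    r = []
    for x in data:
        p0 = math.exp(-0.5 * (x - mu[0])**2)
        p1 = math.exp(-0.5 * (x - mu[1])**2)
        r.append(p1 / (p0 + p1))
    # M-step: responsibility-weighted means
    mu = [
        sum((1 - ri) * x for ri, x in zip(r, data)) / sum(1 - ri for ri in r),
        sum(ri * x for ri, x in zip(r, data)) / sum(r),
    ]
```

In the HW case the E-step is itself a nontrivial smoothing problem for z_t^w, which is where the particle methods of the cited paper come in.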

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing

Black-box Models
A general way to generate very flexible mappings from Z^t to ŷ is
    x(t+1) = f(x(t), u(t), y(t), θ)
    ŷ(t|θ) = h(x(t), θ)
where f and/or h are flexible functions, e.g. in terms of basis-function expansions. Working with both f and h may be too general, and a very common special case is the NLARX model:
    x(t) = φ(t) = [y(t-1), ..., y(t-na), u(t-1), ..., u(t-nb)]ᵀ
    ŷ(t|θ) = h(φ(t), θ) = Σ_{k=1..d} α_k κ_k(φ(t))
for some basis functions κ_k.

Basis Functions
It is natural to think of Taylor expansions: κ_k(φ) = φ^k. If na = 0 (NLFIR), this becomes the classical Volterra series expansion. But note that if dim φ = r, then φ^k has r^k components! A more common choice is to form all the basis functions κ_k from one mother function κ, scaling and positioning the argument differently:
    κ_k(φ) = κ(β_k(φ - γ_k)),   ŷ(t|θ) = Σ_{k=1..d} α_k κ(β_k(φ - γ_k)),   θ = {α_k, β_k, γ_k}
Intuitive picture: think of a scalar φ and let κ(z) be a unit pulse. Then κ(β(φ - γ)) is a pulse of width 1/β starting at φ = γ. The sum above is then a piecewise constant function, capable of approximating "any" function arbitrarily well for large enough d. This covers ANN, LS-SVM etc. (Sjöberg et al., Automatica 1995).

Focusing on f
    x(t+1) = f(x(t), u(t), θ)
    ŷ(t|θ) = h(x(t), θ)
Basis functions for f, h:
- polynomial expansion (Paduart et al., Automatica 2010);
- a Gaussian process (GP) model for f (Rasmussen, inverted-pendulum experiments, with x measured): a basis expansion in terms of the eigenfunctions associated with the kernel (the covariance function of the GP);
- a sine basis; if process noise affects x, particle filtering must be applied to find the predictor (Svensson and Schön, Automatica 2017).
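The unit-pulse intuition can be checked directly (invented target function sin(φ) on an invented interval): with κ a unit pulse, fixed β and the γ_k on a grid, the pulses partition the axis, so each α_k is simply the least-squares average of y over its bin and the expansion is piecewise constant:

```python
# Sketch of the unit-pulse basis expansion yhat = sum_k alpha_k * kappa(beta*(phi - gamma_k)).
# Invented target function; fixed beta and gamma_k, only alpha_k fitted.
import math

def kappa(z):
    # Unit pulse on [0, 1)
    return 1.0 if 0.0 <= z < 1.0 else 0.0

d = 20
beta = d / 5.0                       # pulse width 1/beta = 0.25, covering [0, 5)
gammas = [k / beta for k in range(d)]

phis = [5.0 * k / 400 for k in range(400)]   # sample points
ys = [math.sin(p) for p in phis]

alphas = []
for g in gammas:
    # Pulses are disjoint, so the LS solution per alpha_k is the bin average
    vals = [y for p, y in zip(phis, ys) if kappa(beta * (p - g)) == 1.0]
    alphas.append(sum(vals) / len(vals))

def yhat(p):
    return sum(a * kappa(beta * (p - g)) for a, g in zip(alphas, gammas))

err = max(abs(yhat(p) - math.sin(p)) for p in phis)  # bounded by the bin variation
```

Replacing the pulse by a smooth mother function (sigmoid, Gaussian, ...) and also estimating β_k and γ_k gives exactly the ANN-type structures listed above.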

The Palette of Nonlinear Models
White: known model
Off-white: careful physical modeling, with or without noise models
Smoke-grey: semi-physical modeling (could be used more!)
Steel-grey: local linear models
Slate-grey: block-oriented models
Black: flexible structures, universal approximators
Pit-black: non-parametric smoothing

Pit-black Models: Non-Parametric Smoothing Methods
(Figure: observation points plotted over the regressor plane spanned by u(t-1) and u(t-2).) Form the model surface h(φ(t)) by smoothing over the observation points in the regressor space!
Even blacker! There is a huge literature, mostly in the statistical community and now also in machine learning. It is important to find lower-dimensional manifolds (the counterpart of PCA in linear modeling). Concepts like manifold learning and local linear embedding become central.

Conclusions
Confusingly many approaches! A user-oriented roadmap would be excellent!
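A minimal sketch of such smoothing (invented one-dimensional data; a Nadaraya-Watson-type kernel estimator stands in for the many possible smoothers): the surface value at φ is a kernel-weighted average of the observed outputs near φ:

```python
# Sketch: non-parametric smoothing of the model surface. The value h(phi)
# is a Gaussian-kernel-weighted average of nearby observations. Invented data.
import math

obs_phi = [0.1 * k for k in range(50)]   # observed regressor values
obs_y = [p ** 2 for p in obs_phi]        # observed outputs (invented surface)

def smooth(phi, bandwidth=0.3):
    w = [math.exp(-((phi - p) / bandwidth) ** 2) for p in obs_phi]
    s = sum(w)
    return sum(wi * yi for wi, yi in zip(w, obs_y)) / s

val = smooth(2.0)   # close to the underlying value 2.0**2 = 4.0
```

In higher regressor dimensions such smoothers suffer badly from sparse data, which is why the manifold-finding ideas mentioned above become central.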


Introduction to Gaussian Processes Introduction to Gaussian Processes Iain Murray murray@cs.toronto.edu CSC255, Introduction to Machine Learning, Fall 28 Dept. Computer Science, University of Toronto The problem Learn scalar function of

More information

Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM

Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM Preprints of the 9th World Congress The International Federation of Automatic Control Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM Roger Frigola Fredrik

More information

Hybrid Control and Switched Systems. Lecture #1 Hybrid systems are everywhere: Examples

Hybrid Control and Switched Systems. Lecture #1 Hybrid systems are everywhere: Examples Hybrid Control and Switched Systems Lecture #1 Hybrid systems are everywhere: Examples João P. Hespanha University of California at Santa Barbara Summary Examples of hybrid systems 1. Bouncing ball 2.

More information

p L yi z n m x N n xi

p L yi z n m x N n xi y i z n x n N x i Overview Directed and undirected graphs Conditional independence Exact inference Latent variables and EM Variational inference Books statistical perspective Graphical Models, S. Lauritzen

More information

Linear Regression (continued)

Linear Regression (continued) Linear Regression (continued) Professor Ameet Talwalkar Professor Ameet Talwalkar CS260 Machine Learning Algorithms February 6, 2017 1 / 39 Outline 1 Administration 2 Review of last lecture 3 Linear regression

More information

Kalman filtering and friends: Inference in time series models. Herke van Hoof slides mostly by Michael Rubinstein

Kalman filtering and friends: Inference in time series models. Herke van Hoof slides mostly by Michael Rubinstein Kalman filtering and friends: Inference in time series models Herke van Hoof slides mostly by Michael Rubinstein Problem overview Goal Estimate most probable state at time k using measurement up to time

More information

Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution

Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution Andreas Svensson, Thomas B. Schön, and Fredrik Lindsten Department of Information Technology,

More information

Outline Lecture 2 2(32)

Outline Lecture 2 2(32) Outline Lecture (3), Lecture Linear Regression and Classification it is our firm belief that an understanding of linear models is essential for understanding nonlinear ones Thomas Schön Division of Automatic

More information

Afternoon Meeting on Bayesian Computation 2018 University of Reading

Afternoon Meeting on Bayesian Computation 2018 University of Reading Gabriele Abbati 1, Alessra Tosi 2, Seth Flaxman 3, Michael A Osborne 1 1 University of Oxford, 2 Mind Foundry Ltd, 3 Imperial College London Afternoon Meeting on Bayesian Computation 2018 University of

More information

Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables

Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables Proceedings of the 18th World Congress The International Federation of Automatic Control Closed-loop Identification of Hammerstein Systems Using Iterative Instrumental Variables Younghee Han and Raymond

More information

Nonparametric Bayesian Methods (Gaussian Processes)

Nonparametric Bayesian Methods (Gaussian Processes) [70240413 Statistical Machine Learning, Spring, 2015] Nonparametric Bayesian Methods (Gaussian Processes) Jun Zhu dcszj@mail.tsinghua.edu.cn http://bigml.cs.tsinghua.edu.cn/~jun State Key Lab of Intelligent

More information

Lecture: Gaussian Process Regression. STAT 6474 Instructor: Hongxiao Zhu

Lecture: Gaussian Process Regression. STAT 6474 Instructor: Hongxiao Zhu Lecture: Gaussian Process Regression STAT 6474 Instructor: Hongxiao Zhu Motivation Reference: Marc Deisenroth s tutorial on Robot Learning. 2 Fast Learning for Autonomous Robots with Gaussian Processes

More information

Data assimilation with and without a model

Data assimilation with and without a model Data assimilation with and without a model Tyrus Berry George Mason University NJIT Feb. 28, 2017 Postdoc supported by NSF This work is in collaboration with: Tim Sauer, GMU Franz Hamilton, Postdoc, NCSU

More information

Overfitting, Bias / Variance Analysis

Overfitting, Bias / Variance Analysis Overfitting, Bias / Variance Analysis Professor Ameet Talwalkar Professor Ameet Talwalkar CS260 Machine Learning Algorithms February 8, 207 / 40 Outline Administration 2 Review of last lecture 3 Basic

More information

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS

IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS IDENTIFICATION OF A TWO-INPUT SYSTEM: VARIANCE ANALYSIS M Gevers,1 L Mišković,2 D Bonvin A Karimi Center for Systems Engineering and Applied Mechanics (CESAME) Université Catholique de Louvain B-1348 Louvain-la-Neuve,

More information

Input-output data sets for development and benchmarking in nonlinear identification

Input-output data sets for development and benchmarking in nonlinear identification Input-output data sets for development and benchmarking in nonlinear identification Torbjörn Wigren Division of Systems and Control, Department of Information Technology, Uppsala University, PO box 337,

More information

On Input Design for System Identification

On Input Design for System Identification On Input Design for System Identification Input Design Using Markov Chains CHIARA BRIGHENTI Masters Degree Project Stockholm, Sweden March 2009 XR-EE-RT 2009:002 Abstract When system identification methods

More information

Kernel Methods. Barnabás Póczos

Kernel Methods. Barnabás Póczos Kernel Methods Barnabás Póczos Outline Quick Introduction Feature space Perceptron in the feature space Kernels Mercer s theorem Finite domain Arbitrary domain Kernel families Constructing new kernels

More information

6.435, System Identification

6.435, System Identification SET 6 System Identification 6.435 Parametrized model structures One-step predictor Identifiability Munther A. Dahleh 1 Models of LTI Systems A complete model u = input y = output e = noise (with PDF).

More information

Lecture : Probabilistic Machine Learning

Lecture : Probabilistic Machine Learning Lecture : Probabilistic Machine Learning Riashat Islam Reasoning and Learning Lab McGill University September 11, 2018 ML : Many Methods with Many Links Modelling Views of Machine Learning Machine Learning

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 Outlines Overview Introduction Linear Algebra Probability Linear Regression

More information

Linear Regression and Its Applications

Linear Regression and Its Applications Linear Regression and Its Applications Predrag Radivojac October 13, 2014 Given a data set D = {(x i, y i )} n the objective is to learn the relationship between features and the target. We usually start

More information

Cheng Soon Ong & Christian Walder. Canberra February June 2018

Cheng Soon Ong & Christian Walder. Canberra February June 2018 Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 254 Part V

More information

Pose tracking of magnetic objects

Pose tracking of magnetic objects Pose tracking of magnetic objects Niklas Wahlström Department of Information Technology, Uppsala University, Sweden Novmber 13, 2017 niklas.wahlstrom@it.uu.se Seminar Vi2 Short about me 2005-2010: Applied

More information

Identification, Model Validation and Control. Lennart Ljung, Linköping

Identification, Model Validation and Control. Lennart Ljung, Linköping Identification, Model Validation and Control Lennart Ljung, Linköping Acknowledgment: Useful discussions with U Forssell and H Hjalmarsson 1 Outline 1. Introduction 2. System Identification (in closed

More information

Notes on Noise Contrastive Estimation (NCE)

Notes on Noise Contrastive Estimation (NCE) Notes on Noise Contrastive Estimation NCE) David Meyer dmm@{-4-5.net,uoregon.edu,...} March 0, 207 Introduction In this note we follow the notation used in [2]. Suppose X x, x 2,, x Td ) is a sample of

More information

An efficient stochastic approximation EM algorithm using conditional particle filters

An efficient stochastic approximation EM algorithm using conditional particle filters An efficient stochastic approximation EM algorithm using conditional particle filters Fredrik Lindsten Linköping University Post Print N.B.: When citing this work, cite the original article. Original Publication:

More information

EECE Adaptive Control

EECE Adaptive Control EECE 574 - Adaptive Control Basics of System Identification Guy Dumont Department of Electrical and Computer Engineering University of British Columbia January 2010 Guy Dumont (UBC) EECE574 - Basics of

More information

Learning Gaussian Process Models from Uncertain Data

Learning Gaussian Process Models from Uncertain Data Learning Gaussian Process Models from Uncertain Data Patrick Dallaire, Camille Besse, and Brahim Chaib-draa DAMAS Laboratory, Computer Science & Software Engineering Department, Laval University, Canada

More information

Introduction to Maximum Likelihood Estimation

Introduction to Maximum Likelihood Estimation Introduction to Maximum Likelihood Estimation Eric Zivot July 26, 2012 The Likelihood Function Let 1 be an iid sample with pdf ( ; ) where is a ( 1) vector of parameters that characterize ( ; ) Example:

More information

Accurate Model Identification for Non-Invertible MIMO Sandwich Block-Oriented Processes

Accurate Model Identification for Non-Invertible MIMO Sandwich Block-Oriented Processes Extended abstract for 006 AIChE Annual Meeting, Topic: Process Modeling and Identification, Session: 10B0 Accurate Model Identification for Non-Invertible MIMO Sandwich Block-Oriented Processes Swee-Teng

More information

ESTIMATION ALGORITHMS

ESTIMATION ALGORITHMS ESTIMATIO ALGORITHMS Solving normal equations using QR-factorization on-linear optimization Two and multi-stage methods EM algorithm FEL 3201 Estimation Algorithms - 1 SOLVIG ORMAL EQUATIOS USIG QR FACTORIZATIO

More information

Bayesian Methods and Uncertainty Quantification for Nonlinear Inverse Problems

Bayesian Methods and Uncertainty Quantification for Nonlinear Inverse Problems Bayesian Methods and Uncertainty Quantification for Nonlinear Inverse Problems John Bardsley, University of Montana Collaborators: H. Haario, J. Kaipio, M. Laine, Y. Marzouk, A. Seppänen, A. Solonen, Z.

More information

Just-in-Time Models with Applications to Dynamical Systems

Just-in-Time Models with Applications to Dynamical Systems Linköping Studies in Science and Technology Thesis No. 601 Just-in-Time Models with Applications to Dynamical Systems Anders Stenman REGLERTEKNIK AUTOMATIC CONTROL LINKÖPING Division of Automatic Control

More information

Probabilistic Graphical Models

Probabilistic Graphical Models 2016 Robert Nowak Probabilistic Graphical Models 1 Introduction We have focused mainly on linear models for signals, in particular the subspace model x = Uθ, where U is a n k matrix and θ R k is a vector

More information

Data Driven Discrete Time Modeling of Continuous Time Nonlinear Systems. Problems, Challenges, Success Stories. Johan Schoukens

Data Driven Discrete Time Modeling of Continuous Time Nonlinear Systems. Problems, Challenges, Success Stories. Johan Schoukens 1/51 Data Driven Discrete Time Modeling of Continuous Time Nonlinear Systems Problems, Challenges, Success Stories Johan Schoukens fyuq (,, ) 4/51 System Identification Data Distance Model 5/51 System

More information

Intelligent Systems:

Intelligent Systems: Intelligent Systems: Undirected Graphical models (Factor Graphs) (2 lectures) Carsten Rother 15/01/2015 Intelligent Systems: Probabilistic Inference in DGM and UGM Roadmap for next two lectures Definition

More information

An introduction to particle filters

An introduction to particle filters An introduction to particle filters Andreas Svensson Department of Information Technology Uppsala University June 10, 2014 June 10, 2014, 1 / 16 Andreas Svensson - An introduction to particle filters Outline

More information

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008 Gaussian processes Chuong B Do (updated by Honglak Lee) November 22, 2008 Many of the classical machine learning algorithms that we talked about during the first half of this course fit the following pattern:

More information

Introduction. Chapter 1

Introduction. Chapter 1 Chapter 1 Introduction In this book we will be concerned with supervised learning, which is the problem of learning input-output mappings from empirical data (the training dataset). Depending on the characteristics

More information

Sequential Monte Carlo in the machine learning toolbox

Sequential Monte Carlo in the machine learning toolbox Sequential Monte Carlo in the machine learning toolbox Working with the trend of blending Thomas Schön Uppsala University Sweden. Symposium on Advances in Approximate Bayesian Inference (AABI) Montréal,

More information

STA 4273H: Sta-s-cal Machine Learning

STA 4273H: Sta-s-cal Machine Learning STA 4273H: Sta-s-cal Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 2 In our

More information

Modeling of Surface EMG Signals using System Identification Techniques

Modeling of Surface EMG Signals using System Identification Techniques Modeling of Surface EMG Signals using System Identification Techniques Vishnu R S PG Scholar, Dept. of Electrical and Electronics Engg. Mar Baselios College of Engineering and Technology Thiruvananthapuram,

More information

Massachusetts Institute of Technology

Massachusetts Institute of Technology Massachusetts Institute of Technology 6.867 Machine Learning, Fall 2006 Problem Set 5 Due Date: Thursday, Nov 30, 12:00 noon You may submit your solutions in class or in the box. 1. Wilhelm and Klaus are

More information

Outline. What Can Regularization Offer for Estimation of Dynamical Systems? State-of-the-Art System Identification

Outline. What Can Regularization Offer for Estimation of Dynamical Systems? State-of-the-Art System Identification Outline What Can Regularization Offer for Estimation of Dynamical Systems? with Tianshi Chen Preamble: The classic, conventional System Identification Setup Bias Variance, Model Size Selection Regularization

More information

CHAPTER 10: Numerical Methods for DAEs

CHAPTER 10: Numerical Methods for DAEs CHAPTER 10: Numerical Methods for DAEs Numerical approaches for the solution of DAEs divide roughly into two classes: 1. direct discretization 2. reformulation (index reduction) plus discretization Direct

More information

Expectation propagation for signal detection in flat-fading channels

Expectation propagation for signal detection in flat-fading channels Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA

More information

PILCO: A Model-Based and Data-Efficient Approach to Policy Search

PILCO: A Model-Based and Data-Efficient Approach to Policy Search PILCO: A Model-Based and Data-Efficient Approach to Policy Search (M.P. Deisenroth and C.E. Rasmussen) CSC2541 November 4, 2016 PILCO Graphical Model PILCO Probabilistic Inference for Learning COntrol

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is

More information

Reliability Monitoring Using Log Gaussian Process Regression

Reliability Monitoring Using Log Gaussian Process Regression COPYRIGHT 013, M. Modarres Reliability Monitoring Using Log Gaussian Process Regression Martin Wayne Mohammad Modarres PSA 013 Center for Risk and Reliability University of Maryland Department of Mechanical

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 11 Project

More information

Modelling Non-linear and Non-stationary Time Series

Modelling Non-linear and Non-stationary Time Series Modelling Non-linear and Non-stationary Time Series Chapter 2: Non-parametric methods Henrik Madsen Advanced Time Series Analysis September 206 Henrik Madsen (02427 Adv. TS Analysis) Lecture Notes September

More information

COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017

COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017 COMS 4721: Machine Learning for Data Science Lecture 19, 4/6/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University PRINCIPAL COMPONENT ANALYSIS DIMENSIONALITY

More information

Bayesian Regression Linear and Logistic Regression

Bayesian Regression Linear and Logistic Regression When we want more than point estimates Bayesian Regression Linear and Logistic Regression Nicole Beckage Ordinary Least Squares Regression and Lasso Regression return only point estimates But what if we

More information

Reminders. Thought questions should be submitted on eclass. Please list the section related to the thought question

Reminders. Thought questions should be submitted on eclass. Please list the section related to the thought question Linear regression Reminders Thought questions should be submitted on eclass Please list the section related to the thought question If it is a more general, open-ended question not exactly related to a

More information